2 Jobs Found

Posted: 6 days ago
Experience: 10 to 12 years
Key Skills: Databricks & Apache Spark (PySpark); Python & PL/SQL; ETL / Data Warehouse Development; Big Data Technologies: Spark SQL, DataFrames, Hive, Redshift, Athena; Cloud Environment: AWS (S3, Lambda, Redshift, Redshift Spectrum, Athena); Data Modeling & Solution Design; SDLC Knowledge: Waterfall, Agile, DevOps; Problem Solving & Analytical Skills; Communication & Team Mentoring.

Posted: 33 days ago
Experience: 3 to 5 years
Key Skills: Python, PySpark, Spark DataFrames, Jupyter Notebook, PyCharm, Spark Job Optimization, Git, AWS EMR, Athena, Glue, Lambda, EC2, S3, SNS, Bash/Shell Scripting, ETL, CSV, TSV, XML, JSON, Fixed-width & Multi-record Files, Data Warehousing, Star/Snowflake Schemas, Parquet, Avro, ORC, Snappy, Gzip, Aurora, RDS, Redshift, ElastiCache, DynamoDB, Jenkins, DevOps, Debugging, US Client Interaction, Front-end Frameworks, BFSI Domain.